- probabilistic averaging
- вероятностное усреднение
The English-Russian dictionary on reliability and quality control. 2015.
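The headword itself is the entry's only technical content. As a purely illustrative sketch (not from the dictionary), one common reading of "probabilistic averaging" in reliability work is a probability-weighted average, i.e. the expectation of a discrete random variable; the function name and the numbers below are made-up assumptions:

```python
# Hypothetical illustration: "probabilistic averaging" read as a
# probability-weighted average (the expectation of a discrete variable).
# The values and probabilities are invented example data.

def probabilistic_average(values, probabilities):
    """Return the probability-weighted average sum(p_i * x_i)."""
    if abs(sum(probabilities) - 1.0) > 1e-9:
        raise ValueError("probabilities must sum to 1")
    return sum(p * x for x, p in zip(values, probabilities))

# Assumed time-to-failure outcomes (hours) and their probabilities:
mttf = probabilistic_average([100, 200, 400], [0.5, 0.3, 0.2])
print(round(mttf, 6))  # 100*0.5 + 200*0.3 + 400*0.2 = 190.0
```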
Probabilistic analysis of algorithms — In analysis of algorithms, probabilistic analysis is an approach to estimating the computational complexity of an algorithm or a computational problem. It starts from an assumption about a probability distribution of the set of all… … Wikipedia
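The entry above can be illustrated by a small simulation (an illustrative sketch, not part of the dictionary): estimating the expected comparison count of a naive first-pivot quicksort over uniformly random inputs, the textbook example of probabilistic (average-case) analysis, and comparing it against the asymptotic 2n ln n estimate.

```python
# Sketch of probabilistic (average-case) analysis by simulation:
# estimate the expected number of comparisons quicksort makes on
# inputs drawn uniformly at random, vs. the asymptotic ~2n ln n.
import math
import random

def quicksort_comparisons(a):
    """Count comparisons made by a simple first-pivot quicksort."""
    if len(a) <= 1:
        return 0
    pivot, rest = a[0], a[1:]
    left = [x for x in rest if x < pivot]
    right = [x for x in rest if x >= pivot]
    # Each element of `rest` is compared with the pivot once.
    return len(rest) + quicksort_comparisons(left) + quicksort_comparisons(right)

random.seed(0)
n, trials = 200, 200
avg = sum(quicksort_comparisons(random.sample(range(10**6), n))
          for _ in range(trials)) / trials
print(avg, 2 * n * math.log(n))  # empirical mean vs. leading-term estimate
```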
Information retrieval — Information retrieval (IR) is the area of study concerned with searching for documents, for information within documents, and for… … Wikipedia
PP (complexity) — In complexity theory, PP is the class of decision problems solvable by a probabilistic Turing machine in polynomial time, with an error probability of less than 1/2 for all instances. The abbreviation PP refers to probabilistic polynomial time.… … Wikipedia
Amortized analysis — In computer science, especially analysis of algorithms, amortized analysis refers to finding the average running time per operation over a worst-case sequence of operations. Amortized analysis differs from average-case performance in that… … Wikipedia
Subjective logic — Subjective logic is a type of probabilistic logic that explicitly takes uncertainty and belief ownership into account. In general, subjective logic is suitable for modeling and analysing situations involving uncertainty and incomplete knowledge (A. Jøsang)… … Wikipedia
Bell's theorem — Bell's theorem is a theorem showing that the predictions of quantum mechanics (QM) are not intuitive, and it touches upon fundamental philosophical issues that relate to modern physics. It is the most famous legacy of the late physicist John S. Bell. Bell's… … Wikipedia
Quantum mechanics — Quantum mechanics … Wikipedia
Uncertainty principle — In quantum physics, the Heisenberg uncertainty principle states that locating a particle in a small region of space makes the momentum of the particle uncertain; and conversely, that measuring the momentum of a particle precisely makes the… … Wikipedia
Segmentation (image processing) — In computer vision, segmentation refers to the process of partitioning a digital image into multiple regions (sets of pixels). The goal of segmentation is to simplify and/or change the representation of an image into something that is more… … Wikipedia
Level of measurement — The levels of measurement, or scales of measure, are expressions that typically refer to the theory of scale types developed by the psychologist Stanley Smith Stevens. Stevens proposed his theory in a 1946 Science article titled On the theory of… … Wikipedia
Bootstrapping (statistics) — In statistics, bootstrapping is a modern, computer-intensive, general-purpose approach to statistical inference, falling within a broader class of resampling methods. Bootstrapping is the practice of estimating properties of an estimator (such as… … Wikipedia
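The entry above can be sketched in a few lines (an illustrative example with made-up data, not from the dictionary): resample the data with replacement to estimate the standard error of the sample mean, and check it against the classical s/√n formula.

```python
# Sketch of the bootstrap: estimate the standard error of the sample
# mean by resampling with replacement. The data values are invented.
import random
import statistics

random.seed(42)
data = [2.1, 3.4, 1.9, 4.2, 2.8, 3.1, 2.5, 3.9, 2.2, 3.6]

boot_means = []
for _ in range(5000):
    resample = [random.choice(data) for _ in data]  # draw with replacement
    boot_means.append(statistics.fmean(resample))

se_boot = statistics.stdev(boot_means)                  # bootstrap standard error
se_formula = statistics.stdev(data) / len(data) ** 0.5  # classical s/sqrt(n)
print(se_boot, se_formula)  # the two estimates should roughly agree
```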